Faster Inference (How to process faster) #489

Open
vineeth9059 opened this issue Feb 21, 2025 · 0 comments

Comments

@vineeth9059

How can I increase GPU utilization and reduce runtime?

In my case, I'm running this project on an NVIDIA RTX A4000 with 16 GB of GPU memory, CUDA version 12.4, driver version 550.127.08.

I have tried it with a source image and a driving video.

The driving video is 9 seconds long at 24 fps (216 frames), with audio, at 512 × 512 resolution.

The total execution time is 15-17 seconds while using less than 3 GB of GPU memory. I want to increase the GPU utilization and reduce the runtime; is there any way to do this?
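For reference, a common way to raise GPU utilization in frame-by-frame PyTorch pipelines is to batch frames and run the model under mixed precision. The sketch below is generic and assumes a PyTorch-based pipeline; `model` and the tensor shapes are hypothetical placeholders, not this project's actual code.

```python
import torch

def run_batched_inference(model, frames, batch_size=16, device="cuda"):
    """Run inference over all frames in batches under mixed precision.

    frames: float tensor of shape (N, C, H, W), e.g. (216, 3, 512, 512).
    model:  any per-frame torch.nn.Module (placeholder for the project's model).
    """
    model = model.to(device).eval()
    outputs = []
    with torch.inference_mode():
        for start in range(0, frames.shape[0], batch_size):
            batch = frames[start:start + batch_size].to(device, non_blocking=True)
            # Autocast keeps the weights in fp32 but runs most ops in fp16,
            # which usually raises GPU utilization and cuts runtime on Ampere GPUs.
            with torch.autocast(device_type="cuda", dtype=torch.float16):
                out = model(batch)
            outputs.append(out.float().cpu())
    return torch.cat(outputs, dim=0)
```

Whether this helps depends on where the time actually goes: watching `nvidia-smi` during a run (or using `torch.profiler`) will show whether the bottleneck is the model itself or CPU-side work such as video decoding, face detection, or audio muxing.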
